    Phase Transition and Strong Predictability

    The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed in our previous work [K. Tadaki, Local Proceedings of CiE 2008, pp. 425-434, 2008], where we introduced the notion of thermodynamic quantities into AIT. These quantities are real functions of temperature T>0, and the values of all of them diverge when T exceeds 1. This phenomenon corresponds to a phase transition in statistical mechanics. In this paper we introduce the notion of strong predictability for an infinite binary sequence and then apply it to the partition function Z(T), one of the thermodynamic quantities of AIT. We then reveal a new computational aspect of the phase transition in AIT by showing a critical difference in the behavior of Z(T) between T=1 and T<1 in terms of the strong predictability of the base-two expansion of Z(T).
    Comment: 5 pages, LaTeX2e, no figures
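    For orientation, a minimal sketch of the central quantity (Tadaki's standard definition in the AIT literature; the notation U for the optimal prefix-free machine is our gloss, not quoted from the paper):

        % Partition function at temperature T, summed over all halting
        % programs p of an optimal prefix-free machine U:
        Z(T) = \sum_{p \in \mathrm{dom}(U)} 2^{-|p|/T}
        % Z(1) is Chaitin's Omega; the sum converges for 0 < T <= 1 and
        % diverges for T > 1, which is the phase transition at issue here.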

    Numerical Investigation of Graph Spectra and Information Interpretability of Eigenvalues

    We undertake an extensive numerical investigation of the graph spectra of thousands of regular graphs, a set of random Erd\H{o}s-R\'enyi graphs, the two most popular types of complex networks, and an evolving genetic network, using novel conceptual and experimental tools. Our objective is to contribute to an understanding of the meaning of the eigenvalues of a graph relative to its topological and information-theoretic properties. We introduce a technique for identifying the most informative eigenvalues of evolving networks by comparing the behavior of their graph spectra to their algorithmic complexity. We suggest that these techniques can be extended to further investigate the behavior of evolving biological networks. In the extended version of this paper we apply these techniques to seven tissue-specific regulatory networks as a static example, and to the network of a na\"ive pluripotent immune cell in the process of differentiating towards a Th17 cell as an evolving example, finding the most and least informative eigenvalues at every stage.
    Comment: Forthcoming in 3rd International Work-Conference on Bioinformatics and Biomedical Engineering (IWBBIO), Lecture Notes in Bioinformatics, 201
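    As a toy illustration of the basic objects involved (not the paper's pipeline; graph size, edge probability, and the use of NumPy are our assumptions), the adjacency spectrum of an Erd\H{o}s-R\'enyi graph can be computed as follows:

        import numpy as np

        # Sample an Erdos-Renyi graph G(n, p) as a symmetric 0/1 adjacency matrix.
        rng = np.random.default_rng(0)
        n, p = 200, 0.05
        upper = np.triu(rng.random((n, n)) < p, k=1)   # upper triangle, no self-loops
        adj = (upper | upper.T).astype(float)          # symmetrize: undirected graph

        # Real spectrum of the symmetric adjacency matrix, sorted ascending.
        eigvals = np.linalg.eigvalsh(adj)
        print(eigvals[-5:])   # largest eigenvalues, candidates for "most informative"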

    Universal fluctuations in subdiffusive transport

    Subdiffusive transport in tilted washboard potentials is studied within the fractional Fokker-Planck equation approach, using the associated continuous time random walk (CTRW) framework. The scaled subvelocity is shown to obey a universal law, assuming the form of a stationary L\'evy-stable distribution. The latter is defined by the index of subdiffusion alpha and the mean subvelocity only, but interestingly depends neither on the bias strength nor on the specific form of the potential. These scaled, universal subvelocity fluctuations emerge due to weak ergodicity breaking and vanish in the limit of normal diffusion. The results of the analytical heuristic theory are corroborated by Monte Carlo simulations of the underlying CTRW.
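    The CTRW mechanism behind these fluctuations is straightforward to simulate. A minimal Monte Carlo sketch (the parameter values, unit-jump kinetics, and Pareto waiting-time law are our assumptions, not the paper's setup):

        import numpy as np

        rng = np.random.default_rng(1)
        alpha, bias, t_max, n_walkers = 0.5, 0.6, 1e4, 1000  # subdiffusion index, jump bias

        positions = np.zeros(n_walkers)
        for i in range(n_walkers):
            t, x = 0.0, 0.0
            while True:
                t += 1.0 + rng.pareto(alpha)     # heavy-tailed waiting time, index alpha
                if t >= t_max:
                    break
                x += 1.0 if rng.random() < bias else -1.0   # biased unit jump
            positions[i] = x

        # Scaled subvelocity x / t^alpha: its distribution is the universal,
        # Levy-stable law discussed above, insensitive to the bias details.
        subvel = positions / t_max ** alpha
        print(subvel.mean(), subvel.std())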

    Computational and Biological Analogies for Understanding Fine-Tuned Parameters in Physics

    In this philosophical paper, we explore computational and biological analogies to address the fine-tuning problem in cosmology. We first clarify what it means for physical constants or initial conditions to be fine-tuned. We review important distinctions, such as that between dimensionless and dimensional physical constants, and the classification of constants proposed by L\'evy-Leblond. Then we explore how two great analogies, computational and biological, can give new insights into our problem. This paper includes a preliminary study to examine the two analogies. Importantly, analogies are both useful and fundamental cognitive tools, but can also be misused or misinterpreted. The idea that our universe might be modelled as a computational entity is analysed, and we discuss the distinction between physical laws and initial conditions using algorithmic information theory. Smolin introduced the theory of "Cosmological Natural Selection" with a biological analogy in mind. We examine an extension of this analogy involving intelligent life, and discuss whether and how this extension could be legitimated.
    Keywords: origin of the universe, fine-tuning, physical constants, initial conditions, computational universe, biological universe, role of intelligent life, cosmological natural selection, cosmological artificial selection, artificial cosmogenesis.
    Comment: 25 pages, Foundations of Science, in press

    Polynomial iterative algorithms for coloring and analyzing random graphs

    We study the graph coloring problem over random graphs of finite average connectivity c. Given a number q of available colors, we find that graphs with low connectivity almost always admit a proper coloring, whereas graphs with high connectivity are uncolorable. Depending on q, we find the precise value of the critical average connectivity c_q. Moreover, we show that below c_q there exists a clustered phase c \in [c_d, c_q] in which the ground states spontaneously divide into an exponential number of clusters. Furthermore, we extend our considerations to the case of single instances, showing consistent results. This leads us to propose a new algorithm able to color random graphs in polynomial time in the hard but colorable region, i.e. when c \in [c_d, c_q].
    Comment: 23 pages, 10 eps figures
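    As a toy illustration of the problem setup only (the paper's polynomial algorithm is based on message passing, not on the greedy heuristic below; all parameter values are our assumptions):

        import random

        random.seed(0)
        n, c, q = 500, 3.0, 4            # vertices, average connectivity, available colors
        p = c / (n - 1)                  # edge probability giving mean degree c

        # Sample an Erdos-Renyi random graph as an adjacency list.
        nbrs = {v: set() for v in range(n)}
        for i in range(n):
            for j in range(i + 1, n):
                if random.random() < p:
                    nbrs[i].add(j)
                    nbrs[j].add(i)

        # Greedy q-coloring, highest-degree vertices first.
        color = {}
        for v in sorted(range(n), key=lambda v: -len(nbrs[v])):
            used = {color[u] for u in nbrs[v] if u in color}
            free = [k for k in range(q) if k not in used]
            if not free:
                print("greedy heuristic got stuck at vertex", v)
                break
            color[v] = free[0]
        else:
            print("found a proper", q, "coloring of all", n, "vertices")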

    A Behavioural Foundation for Natural Computing and a Programmability Test

    What does it mean to claim that a physical or natural system computes? One answer, endorsed here, is that computing is about programming a system to behave in different ways. This paper offers an account of what it means for a physical system to compute based on this notion. It proposes a behavioural characterisation of computing in terms of a measure of programmability, which reflects a system's ability to react to external stimuli. The proposed measure of programmability is useful for classifying computers in terms of the apparent algorithmic complexity of their evolution in time. I make some specific proposals in this connection and discuss this approach in the context of other behavioural approaches, notably Turing's test of machine intelligence. I also anticipate possible objections and consider the applicability of these proposals to the task of relating abstract computation to nature-like computation.
    Comment: 37 pages, 4 figures. Based on an invited talk at the Symposium on Natural/Unconventional Computing and its Philosophical Significance, Alan Turing World Congress 2012, Birmingham, UK. http://link.springer.com/article/10.1007/s13347-012-0095-2 Ref. glitch fixed in 2nd version; Philosophy & Technology (special issue on History and Philosophy of Computing), Springer, 201
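    One common way to approximate the algorithmic complexity of observed behaviour is lossless compression. A hedged sketch of that general idea (not the paper's exact measure; the toy behaviours below are invented for illustration):

        import zlib

        def complexity(behaviour: str) -> int:
            """Compressed length: a crude upper bound on algorithmic complexity."""
            return len(zlib.compress(behaviour.encode()))

        # Hypothetical recorded time evolutions of one system under two stimuli.
        run_a = "01" * 500                # a highly regular response
        run_b = "0110100110010110" * 62   # structured, but less compressible
        print(complexity(run_a), complexity(run_b))

        # The larger the complexity change a system exhibits across external
        # stimuli, the more "programmable" its behaviour, in this reading.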

    G\"odel Incompleteness and the Black Hole Information Paradox

    Semiclassical reasoning suggests that the process by which an object collapses into a black hole and then evaporates by emitting Hawking radiation may destroy information, a problem often referred to as the black hole information paradox. Further, there seems to be no unique prediction of where the information about the collapsing body is localized. We propose that the latter aspect of the paradox may be a manifestation of an inconsistent self-reference in the semiclassical theory of black hole evolution. This suggests the inadequacy of the semiclassical approach or, at worst, that standard quantum mechanics and general relativity are fundamentally incompatible. One option for resolving the localization aspect of the paradox is to identify the G\"odel-like incompleteness that corresponds to an imposition of consistency, and to introduce possibly new physics that supplies this incompleteness. Another option is to modify the theory in such a way as to prohibit self-reference. We discuss various possible scenarios to implement these options, including eternally collapsing objects, black hole remnants, black hole final states, and simple variants of semiclassical quantum gravity.
    Comment: 14 pages, 2 figures; revised according to journal requirements

    Entropy and Quantum Kolmogorov Complexity: A Quantum Brudno's Theorem

    In classical information theory, entropy rate and Kolmogorov complexity per symbol are related by a theorem of Brudno. In this paper, we prove a quantum version of this theorem, connecting the von Neumann entropy rate and two notions of quantum Kolmogorov complexity, both based on the shortest qubit descriptions of qubit strings that, run by a universal quantum Turing machine, reproduce them as outputs.
    Comment: 26 pages, no figures. Reference to publication added: published in Communications in Mathematical Physics (http://www.springerlink.com/content/1432-0916/)
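    For orientation, the classical statement being quantized (a standard formulation; the notation is ours):

        % Brudno's theorem: for a shift-invariant ergodic source \mu with
        % entropy rate h(\mu), for \mu-almost every infinite sequence x,
        \lim_{n \to \infty} \frac{K(x_1 \cdots x_n)}{n} = h(\mu)
        % The quantum version replaces h(\mu) by the von Neumann entropy
        % rate and K by quantum Kolmogorov complexity.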